Added TF specific documentation to DistributedEmbedding. #94


Merged: 1 commit, May 21, 2025

Conversation

hertschuh
Collaborator

No description provided.

@hertschuh hertschuh force-pushed the embedding_tf branch 3 times, most recently from 4939b13 to 2bd6f5a Compare May 21, 2025 20:14
Collaborator

@cantonios cantonios left a comment


Approving, with a couple of minor nits inline.


In addition to `tf.Tensor`, `DistributedEmbedding` accepts `tf.RaggedTensor`
and `tf.SparseTensor` as inputs for the embedding lookups. Ragged tensors
must be ragged in dimension 1. Note that if weights are passed, each weight
Collaborator

Is dimension 1 the dimension with index 0? Does TF actually support any other kind of raggedness?

Collaborator Author

I meant the dimension with index 1. TF supports raggedness in any dimension, and in fact multiple ragged dimensions: a tensor can be ragged twice.

https://www.tensorflow.org/api_docs/python/tf/RaggedTensor#multiple_ragged_dimensions

https://www.tensorflow.org/api_docs/python/tf/RaggedTensor#attributes (see ragged_rank)
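The distinction can be sketched with `tf.ragged.constant` (an illustration of `ragged_rank`, not code from this PR):

```python
import tensorflow as tf

# Ragged in the dimension with index 1: rows have varying lengths.
rt1 = tf.ragged.constant([[1, 2, 3], [4]])  # shape (2, None), ragged_rank 1

# Twice ragged: both inner dimensions vary independently.
rt2 = tf.ragged.constant([[[1, 2], [3]], [[4]]])  # ragged_rank 2
```

`ragged_rank` counts the number of ragged dimensions; only `ragged_rank == 1` inputs match the "ragged in dimension 1" restriction the docstring states.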

In addition to `tf.Tensor`, `DistributedEmbedding` accepts `tf.RaggedTensor`
and `tf.SparseTensor` as inputs for the embedding lookups. Ragged tensors
must be ragged in dimension 1. Note that if weights are passed, each weight
tensor must be of the same type as the inputs for that particular feature
Collaborator

"same type" might be a bit confusing because we expect the datatype for indices to be integers, but weights to be floats.

Same class?
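The "same class" reading can be sketched as follows (a hypothetical illustration of the convention under discussion, not the library's API): integer indices and float weights, but both carried in the same tensor class with identical row partitions.

```python
import tensorflow as tf

# Indices are integers and weights are floats (different dtypes),
# but both are tf.RaggedTensor instances with the same row structure.
ids = tf.ragged.constant([[10, 23], [7]], dtype=tf.int64)
weights = tf.ragged.constant([[0.5, 1.0], [2.0]], dtype=tf.float32)

same_class = type(ids) is type(weights)
same_rows = (ids.row_lengths().numpy().tolist()
             == weights.row_lengths().numpy().tolist())
```

Phrasing the requirement as "same class (and row structure)" avoids implying the dtypes must match.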

@hertschuh hertschuh merged commit 31d881b into keras-team:main May 21, 2025
5 checks passed